Create a Medical Image Annotation Job

Files Submitted

Are all required files submitted?

The submission zip file includes a complete Instructions_Preview.html file and a Proposal file (a PDF).

Instructions: Annotator Instructions & Examples

Are your instructions complete?

The Instructions html file includes the following sections:

  1. Overview
  2. Steps
  3. Rules & Tips
  4. Examples
  5. Visible test questions (an untitled section), which show the general layout of what an annotator will see during this job; no answers need to be selected.

Do the Overview and Steps sections clearly indicate your data annotation goals?

Your Overview should briefly describe why you are creating the data annotation job, and the Steps should clearly explain what is expected of an annotator.

Does the Rules & Tips section describe each kind of data label?

All possible labels/answers should be clearly defined in the Rules & Tips section. If the correct choice of a label is not obvious, it is best practice to add clarifying criteria here.

Is there a corresponding example for every label?

The Examples section should include at least one example for each possible data label.

Proposal: Design & Quality Assurance

Does your proposal describe the industry problem you are trying to solve?

Include a short overview of this project and the product goal.

Do you justify your choice of data labels?

Explain why you chose the labeling scheme that you did, and describe its strengths and weaknesses.

Did you specify the number of test questions you'd like to include for this small dataset?

You should plan to mix at least 5% test questions into your training set, or about 1 test question for every 19 data points you want labeled, as the sketch below illustrates.

Have you answered the questions about how you plan to evaluate the quality of your test questions?

Your test questions may not be perfect, so you should have a plan for revisiting their efficacy. You should answer the provided questions about how you might handle scenarios in which multiple annotators contest or fail a test question.

Did you include a discussion of potential weaknesses and areas of improvement?

You should describe what might be missing from your data annotation job and what might be improved. Provide answers to the following questions:

  • Could the data source be improved? If so, how?
  • Do you think your test questions or labeling scheme could be improved? How might you ensure the quality of the data over time?

You might also consider involving stakeholders (engineers, medical professionals, etc.) in discussions about product improvements.

Do you describe how you might respond to contributor feedback?

You should note which areas of your instructions and test questions you might improve according to annotator feedback.

Tips to make your project stand out:

  • Design labels for all scenarios; the best annotations and test questions should be designed to handle failure cases.
  • As you work through this project, you are encouraged to think about how the Figure Eight platform itself might be improved. It is often the job of a Product Manager to keep the user in mind and design for the best experience, and this is a good thought exercise in identifying what is really great or challenging about an existing platform.